The Rényi redundancy of generalized Huffman codes

Authors

  • Anselm Blumer
  • Robert J. McEliece
Abstract

If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average, then a simple modification of Huffman's algorithm gives optimal codes. The redundancy can now be measured as the difference between this new average and Rényi's generalization of Shannon's entropy. By decreasing some of the codeword lengths in a Shannon code, the upper bound on the redundancy given in the standard proof of the noiseless source coding theorem is improved. The lower bound is improved by randomizing between codeword lengths, which allows linear programming techniques to be applied to an integer programming problem. These bounds are shown to be asymptotically equal, providing a new proof of Kricevski's results on the redundancy of Huffman codes. These results are generalized to the Rényi case and are related to Gallager's bound on the redundancy of Huffman codes.

PREVIOUS WORK

In 1961, Rényi [12] proposed that the Shannon entropy could be generalized to

$$H_s(p) = \frac{s+1}{s}\,\log_2\!\left(\sum_{i=1}^{m} p_i^{1/(s+1)}\right), \quad s > 0,$$

which approaches the Shannon entropy as $s \to 0^+$. In 1965, Campbell [1] showed that just as the Shannon entropy is a lower bound on the average codeword length of a uniquely decodable code, the Rényi entropy is a lower bound on the exponentially weighted average codeword length

$$\frac{1}{s}\,\log_2\!\left(\sum_{i=1}^{m} p_i\,2^{s l_i}\right), \quad s > 0.$$
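The relationship between the two quantities above can be checked numerically. The sketch below computes the Rényi entropy $H_s(p)$ and Campbell's exponentially weighted average length, and builds codeword lengths with an exponential variant of Huffman's algorithm in which the two smallest weights $w_i, w_j$ are merged into $2^s(w_i + w_j)$ — this merge rule is an assumption about the paper's "simple modification", not taken from the text:

```python
import heapq
import math

def renyi_entropy(p, s):
    """Renyi entropy in the paper's parameterization:
    H_s(p) = ((s+1)/s) * log2(sum_i p_i^(1/(s+1))), s > 0."""
    return (s + 1) / s * math.log2(sum(pi ** (1 / (s + 1)) for pi in p))

def exp_average_length(p, lengths, s):
    """Campbell's exponentially weighted average codeword length:
    (1/s) * log2(sum_i p_i * 2^(s * l_i)), s > 0."""
    return (1 / s) * math.log2(sum(pi * 2 ** (s * li)
                                   for pi, li in zip(p, lengths)))

def exponential_huffman_lengths(p, s):
    """Codeword lengths from an exponential Huffman variant (assumed form):
    repeatedly merge the two smallest weights w1, w2 into 2^s * (w1 + w2),
    incrementing the depth of every symbol under the merged node."""
    heap = [(pi, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    while len(heap) > 1:
        w1, ids1 = heapq.heappop(heap)
        w2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:
            lengths[i] += 1          # symbols move one level deeper
        heapq.heappush(heap, (2 ** s * (w1 + w2), ids1 + ids2))
    return lengths

p = [0.5, 0.25, 0.15, 0.1]
s = 0.5
lengths = exponential_huffman_lengths(p, s)
# Campbell's bound: H_s(p) <= exponentially weighted average length.
print(renyi_entropy(p, s) <= exp_average_length(p, lengths, s))
```

For this example the resulting lengths satisfy the Kraft equality, and the Rényi entropy sits below the exponential average length, as Campbell's bound requires.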


Similar articles

Bounds on Generalized Huffman Codes

New lower and upper bounds are obtained for the compression of optimal binary prefix codes according to various nonlinear codeword length objectives. Like the coding bounds for Huffman coding — which concern the traditional linear code objective of minimizing average codeword length — these are in terms of a form of entropy and the probability of the most probable input symbol. As in Huffman co...


Redundancy-Related Bounds on Generalized Huffman Codes

This paper presents new lower and upper bounds for the compression rate of optimal binary prefix codes on memoryless sources according to various nonlinear codeword length objectives. Like the most well-known redundancy bounds for minimum (arithmetic) average redundancy coding — Huffman coding — these are in terms of a form of entropy and/or the probability of the most probable input symbol. Th...


Source Coding for Campbell’s Penalties

Given a probability vector, Huffman coding finds a corresponding prefix-free binary code that minimizes the mean codeword length. In this paper we explore situations in which the goals are different from those in Huffman coding. We explore a family of penalties (generalized means) proposed by Campbell [8], finding redundancy bounds for a common subfamily. We generalize an efficient algorithm fo...


A simple upper bound on the redundancy of Huffman codes

Upper bounds on the redundancy of Huffman codes have been extensively studied in the literature. Almost all of these bounds are in terms of the probability of either the most likely or the least likely source symbol. In this correspondence, we prove a simple upper bound in terms of the probability of any source symbol.


Decoding prefix codes

Minimum-redundancy prefix codes have been a mainstay of research and commercial compression systems since their discovery by David Huffman more than 50 years ago. In this experimental evaluation we compare techniques for decoding minimum-redundancy codes, and quantify the relative benefits of recently developed restricted codes that are designed to accelerate the decoding process. We find that ...



Journal:
  • IEEE Trans. Information Theory

Volume 34, Issue —

Pages —

Published 1988